Iterative Semi-Supervised Learning Using Softmax Probability

Authors

Abstract

For the classification problem in practice, one of the challenging issues is obtaining enough labeled data for training. Moreover, even if such data have been sufficiently accumulated, most datasets exhibit a long-tailed distribution with heavy class imbalance, which results in a model biased towards the majority classes. To alleviate this, semi-supervised learning methods using additional unlabeled data have been considered. However, as a matter of course, their accuracy is much lower than that of supervised learning. In this study, under the assumption that additional unlabeled data are available, we propose iterative semi-supervised learning algorithms, which iteratively correct the labeling of the extra unlabeled data based on softmax probabilities. The results show that the proposed algorithms provide high classification accuracy, validated and tested under two scenarios: a balanced dataset and an imbalanced dataset. Under both scenarios, our algorithms provided higher accuracy than previous state-of-the-art methods. Code is available at .
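The abstract describes the idea only in prose, so below is a minimal, hedged sketch of generic iterative pseudo-labeling driven by softmax-style class probabilities: unlabeled samples receive (and are later re-corrected to) a label only when the model's top class probability is sufficiently high. The base learner (`LogisticRegression`), the threshold `tau`, the iteration count, and the assumption of integer class labels are all illustrative choices, not the paper's actual configuration.

```python
# Minimal sketch of iterative pseudo-labeling based on class-probability
# confidence. Not the authors' exact algorithm: the base learner, threshold
# `tau`, iteration count, and integer class labels are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

def iterative_pseudo_label(X_lab, y_lab, X_unlab, n_iters=5, tau=0.9):
    """Iteratively assign and re-correct labels on X_unlab using the
    model's predicted class probabilities (softmax output)."""
    pseudo = np.full(len(X_unlab), -1)  # -1 means "not yet pseudo-labeled"
    clf = None
    for _ in range(n_iters):
        # Train on the labeled set plus currently accepted pseudo-labels.
        keep = pseudo != -1
        X_train = np.vstack([X_lab, X_unlab[keep]])
        y_train = np.concatenate([y_lab, pseudo[keep]])
        clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

        # Accept or correct a pseudo-label only when the top softmax-style
        # probability exceeds the confidence threshold tau.
        probs = clf.predict_proba(X_unlab)
        conf = probs.max(axis=1)
        pred = clf.classes_[probs.argmax(axis=1)]
        pseudo = np.where(conf >= tau, pred, pseudo)
    return clf, pseudo
```

In such schemes a higher `tau` admits fewer but cleaner pseudo-labels, while a lower `tau` uses more unlabeled data at the cost of noisier training targets; the paper's contribution lies in how the labeling is iteratively corrected, which this sketch only approximates.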


Similar References

Multi-softmax deep neural network for semi-supervised training

In this paper we propose a Shared Hidden Layer Multisoftmax Deep Neural Network (SHL-MDNN) approach for semi-supervised training (SST). This approach aims to boost low-resource speech recognition where limited training data is available. Supervised data and unsupervised data share the same hidden layers but are fed into different softmax layers so that erroneous automatic speech recognition (AS...

Full text
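The snippet above describes the core architectural idea: supervised and automatically transcribed data share the same hidden layers but are scored by separate softmax output layers. Below is a minimal sketch of that structure, assuming a small feed-forward network in PyTorch; the layer sizes and two-head wiring are illustrative, not the SHL-MDNN paper's configuration.

```python
# Sketch of a shared-hidden-layer network with two softmax heads, one per
# data stream. Layer sizes and the two-branch setup are assumptions.
import torch
import torch.nn as nn

class SharedHiddenMultiSoftmax(nn.Module):
    def __init__(self, n_in, n_hidden, n_classes):
        super().__init__()
        self.shared = nn.Sequential(        # hidden layers shared by both streams
            nn.Linear(n_in, n_hidden), nn.ReLU(),
            nn.Linear(n_hidden, n_hidden), nn.ReLU(),
        )
        self.head_sup = nn.Linear(n_hidden, n_classes)    # head for supervised data
        self.head_unsup = nn.Linear(n_hidden, n_classes)  # head for auto-labeled data

    def forward(self, x, supervised=True):
        h = self.shared(x)
        # Each head is trained with cross-entropy (implicit softmax) on its
        # own stream, so label errors in the auto-transcribed data do not
        # directly distort the supervised output layer.
        return self.head_sup(h) if supervised else self.head_unsup(h)
```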

Iterative Double Clustering for Unsupervised and Semi-Supervised Learning

We present a powerful meta-clustering technique called Iterative Double Clustering (IDC). The IDC method is a natural extension of the recent Double Clustering (DC) method of Slonim and Tishby that exhibited impressive performance on text categorization tasks [12]. Using synthetically generated data we empirically find that whenever the DC procedure is successful in recovering some of the struc...

Full text

Semi-Supervised Metric Learning Using Pairwise Constraints

Distance metric has an important role in many machine learning algorithms. Recently, metric learning for semi-supervised algorithms has received much attention. For semi-supervised clustering, usually a set of pairwise similarity and dissimilarity constraints is provided as supervisory information. Until now, various metric learning methods utilizing pairwise constraints have been proposed. The...

Full text
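As a concrete illustration of how pairwise constraints can shape a metric (a generic sketch, not any specific method surveyed in that paper), the inverse covariance of must-link difference vectors gives a Mahalanobis metric that down-weights directions in which "similar" pairs vary. The helper names and the regularizer `eps` are assumptions for illustration.

```python
# Generic sketch: derive a Mahalanobis metric from must-link constraints.
import numpy as np

def mahalanobis_from_constraints(X, must_link_pairs, eps=1e-3):
    """Return M such that d(x, y) = sqrt((x - y)^T M (x - y))."""
    diffs = np.array([X[i] - X[j] for i, j in must_link_pairs])
    cov = diffs.T @ diffs / len(diffs)                 # scatter of similar-pair differences
    M = np.linalg.inv(cov + eps * np.eye(X.shape[1]))  # shrink directions where similar pairs differ
    return M

def pairwise_distance(x, y, M):
    d = x - y
    return float(np.sqrt(d @ M @ d))
```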

Active Semi-Supervised Learning using Submodular Functions

We consider active, semi-supervised learning in an offline transductive setting. We show that a previously proposed error bound for active learning on undirected weighted graphs can be generalized by replacing graph cut with an arbitrary symmetric submodular function. Arbitrary non-symmetric submodular functions can be used via symmetrization. Different choices of submodular functions give diff...

Full text

Semi-supervised learning using greedy max-cut

Graph-based semi-supervised learning (SSL) methods play an increasingly important role in practical machine learning systems, particularly in agnostic settings when no parametric information or other prior knowledge is available about the data distribution. Given the constructed graph represented by a weight matrix, transductive inference is used to propagate known labels to predict the values ...

Full text
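The transductive step the snippet refers to, propagating known labels through a graph given its weight matrix, can be sketched as standard iterative label propagation with clamping. This shows the generic diffusion scheme, not the greedy max-cut method itself; the iteration count and the assumption that every node has at least one edge are illustrative.

```python
# Sketch of transductive label propagation on a graph with weight matrix W.
import numpy as np

def propagate_labels(W, labels, n_classes, n_iters=100):
    """labels: int array with a class index for labeled nodes, -1 for unlabeled."""
    n = len(labels)
    labeled = labels >= 0
    F = np.zeros((n, n_classes))
    F[labeled, labels[labeled]] = 1.0            # one-hot seed for labeled nodes

    P = W / W.sum(axis=1, keepdims=True)         # row-normalized transitions (assumes no isolated nodes)
    for _ in range(n_iters):
        F = P @ F                                # diffuse label mass along weighted edges
        F[labeled] = 0.0
        F[labeled, labels[labeled]] = 1.0        # clamp the known labels each step
    return F.argmax(axis=1)                      # predicted class per node
```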


Journal

Journal title: Computers, Materials & Continua

Year: 2022

ISSN: 1546-2218, 1546-2226

DOI: https://doi.org/10.32604/cmc.2022.028154